
Reviews: Thermostat-assisted continuously-tempered Hamiltonian Monte Carlo for Bayesian learning

Neural Information Processing Systems

This paper presents a sampling method that combines Hamiltonian Monte Carlo (HMC), mini-batches, tempering, and thermostats to explore multimodal target distributions more efficiently. It is demonstrated on a number of sizeable neural-network problems using real data sets. This is an interesting method, and the empirical results are strong. Figure 2 does a nice job of demonstrating how the omission of any of the ingredients (e.g. the tempering or the thermostat) is detrimental to the overall result, which nicely illustrates how the components work together. This is followed by substantial image classification examples.



Thermostat-assisted continuously-tempered Hamiltonian Monte Carlo for Bayesian learning

Luo, Rui, Wang, Jianhong, Yang, Yaodong, Wang, Jun, Zhu, Zhanxing

Neural Information Processing Systems

In this paper, we propose a novel sampling method, the thermostat-assisted continuously-tempered Hamiltonian Monte Carlo, for the purpose of multimodal Bayesian learning. It simulates a noisy dynamical system by incorporating both a continuously-varying tempering variable and the Nosé-Hoover thermostats. A significant benefit is that it is not only able to efficiently generate i.i.d. samples when the underlying posterior distributions are multimodal, but also capable of adaptively neutralising the noise arising from the use of mini-batches. While the properties of the approach have been studied using synthetic datasets, our experiments on three real datasets have also shown its performance gains over several strong baselines for Bayesian learning with various types of neural networks plugged in.
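To illustrate the thermostat idea the abstract describes, the sketch below runs Nosé-Hoover-style thermostatted stochastic-gradient dynamics on a toy bimodal target. This is a minimal sketch in the spirit of stochastic-gradient Nosé-Hoover thermostat methods, not the authors' algorithm: it omits the continuously-varying tempering variable, and the double-well potential, step size, and noise scale are all illustrative assumptions. The thermostat variable `xi` adapts the friction so the kinetic energy stays near its target value even when the force on the momentum is noisy, which is the noise-neutralising role the thermostats play in the paper's sampler.

```python
import numpy as np

def grad_U(theta):
    # Double-well potential U(theta) = (theta^2 - 1)^2 with minima at +/-1;
    # the target density p(theta) ∝ exp(-U(theta)) is therefore bimodal.
    return 4.0 * theta * (theta ** 2 - 1.0)

def nose_hoover_sampler(n_steps=100_000, h=0.01, A=1.0, seed=0):
    """Thermostatted stochastic-gradient dynamics (illustrative sketch only).

    xi is the Nose-Hoover thermostat variable: it grows when the kinetic
    energy p^2 exceeds its target of 1 (increasing friction) and shrinks
    when it falls below (decreasing friction), so injected noise is
    adaptively dissipated. The continuous tempering variable of the
    paper's method is deliberately omitted here.
    """
    rng = np.random.default_rng(seed)
    theta, p, xi = 1.0, 0.0, A
    samples = np.empty(n_steps)
    for t in range(n_steps):
        # Momentum update: gradient force, thermostat friction, injected noise.
        p += (-h * grad_U(theta) - h * xi * p
              + np.sqrt(2.0 * A * h) * rng.standard_normal())
        theta += h * p
        # Thermostat update: drive p^2 toward 1 (unit temperature).
        xi += h * (p * p - 1.0)
        samples[t] = theta
    return samples
```

Running the chain long enough, the samples visit both modes near +1 and -1: the barrier between the wells is low relative to the unit temperature, so the dynamics cross it repeatedly while the thermostat keeps the momenta at the right scale.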

